random segmentation

Terms from Artificial Intelligence: humans at the heart of algorithms

Page numbers are for the draft copy at present; they will be replaced with correct numbers when the final book is formatted. Chapter numbers are correct and will not change now.

Random segmentation is a way to break big data into subsets small enough to process. Rather than using a systematic segmentation rule, the data is divided randomly into subsets of the desired size. As well as being simple to implement, it can have statistical advantages, as it ensures that different kinds of item are spread uniformly across the data segments.
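The idea can be sketched in a few lines of Python (a hypothetical helper, `random_segments`, not taken from the book): shuffle the data once, then cut the shuffled list into consecutive chunks of the desired size.

```python
import random

def random_segments(items, segment_size, seed=None):
    """Split items into randomly composed segments of at most segment_size.

    Shuffling first means each segment is a random subset, so different
    kinds of item tend to be spread uniformly across the segments.
    """
    rng = random.Random(seed)  # seed allows a reproducible segmentation
    shuffled = list(items)
    rng.shuffle(shuffled)
    return [shuffled[i:i + segment_size]
            for i in range(0, len(shuffled), segment_size)]

segments = random_segments(range(10), 3, seed=42)
# Ten items split into segments of size 3, 3, 3 and 1;
# each segment can now be processed independently.
```

Note that the last segment may be smaller than the rest when the data size is not a multiple of the segment size.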

Used in Chap. 8: page 162